Pragmatic, formative process evaluations of complex interventions and why we need more of them.
Abstract
Recently published guidance on process evaluations by the Medical Research Council's (MRC's) Population Health Sciences Research Network (PHSRN) marks a significant advance in the evaluation of complex public health interventions.[2] In presenting programmes not just as a set of mechanisms of change across multiple socioecological domains, but as an interaction of theory, context and implementation, the guidance extends the remit of evaluation and forces us to reassess the responsiveness of existing methodologies and frameworks. Process evaluations have emerged as vital instruments in responding to these changing needs, through the modelling of causal mechanisms, the identification of salient contextual influences, and the monitoring of fidelity and adaptations, which permits the circumvention of type III errors. While the guidance offers an instructive set of standards, the authors' acknowledgement that there is no such thing as a 'typical' process evaluation ensures continued scope for debate and development around this framework. Specifically, the predominant focus on embedding process evaluations within definitive effectiveness trials encourages further theoretical and practical exploration of formative process evaluation. This approach is defined by the preclinical and first phases of the MRC's guidance on the development and evaluation of complex interventions.[5] The preclinical phase involves the development of the intervention's theoretical rationale, primarily through consultation of the relevant literature, while phase 1 focuses on the modelling of processes and outcomes in order to identify underpinning active ingredients and delineate how intervention components combine synergistically to generate outcomes.

One particular conceptual space that needs to be carved out is that of pragmatic formative process evaluation. This may be defined as the application of formative process evaluation criteria to interventions that have ostensibly been fully formulated, and are likely already in routine practice, but have not been subjected to rigorous scientific development and evaluation. These interventions are often distinguishable by their lack of a robust evidence base. The term pragmatic formative process evaluation is used here with intent, in order to achieve consistency in terminology with pragmatic policy trials and, to a lesser extent, natural experiments. These approaches also take advantage of expedient evaluation opportunities within real-world settings, and are often integrated into the process of disseminating interventions or programmes of legislation, with randomisation being nested within roll-out.

Focus on the development of pragmatic formative process evaluations is largely justified by the abundance of widely practised but non-evidence-based complex approaches in public health. Explanations for this include the presumed irrelevance of social equipoise, dissonant policy and research timescales, and the perception that an intervention will not confer harm.[7] However, such interventions are often left out of discussions around formative evaluation. Indeed, the MRC's guidance claims that if an intervention is already widely delivered, a modelling and testing phase may often not be essential. Yet, for many practised interventions, the rigorous process of theoretical development and interrogation of implicit causal assumptions may still need to be conducted.
Moreover, even where some understanding of the theory of change is present, it is unlikely that the unintended consequences of interventions will have been sufficiently theorised and empirically explored. For example, our recent pragmatic formative process evaluation of a school-based social and emotional learning intervention, which had been recommended by the Welsh school inspectorate as best practice in managing challenging behaviour, indicated a number of potential iatrogenic effects due to a stigmatising and negative targeting process.

The research frameworks and methodologies employed as part of pragmatic formative process evaluations will likely reflect those used in formative evaluations of novel interventions during the preclinical and first phases of the MRC guidance. This should include (systematic) review to identify the existing evidence base in order to theorise and verify the intervention's active ingredients. Effective consultation with programme developers is of paramount importance, so as to elicit their knowledge, assumptions and understandings around intervention theory. Inclusion of relevant stakeholders and target populations, largely through qualitative research, is also required to understand contextual influences, unravel implementation procedures, and predict feasibility and acceptability. This understanding can help to mitigate implementation practices that compromise theoretical integrity, while allowing interventions to be responsive to specific contextual needs. A range of frames can be drawn on from implementation science to theorise the delivery of interventions within real-world settings, with Rogers' diffusion of innovations theory gaining increased currency.

There is now a wealth of literature to support the process of knowledge exchange between research and practice, which can be exploited to enhance the conduct of pragmatic formative process evaluations. However, there remains a propensity to treat this exchange as unidirectional, with knowledge being disseminated from research to practice. Pragmatic formative evaluation demands a more cyclical understanding of knowledge transfer, with a particular focus on enhancing the methods and modalities that can support the translation of practice-based knowledge and ideas to research. Emerging examples of frameworks that may be adapted to enhance translational research include Spoth et al's TSci Impact Framework. In finding ways of working with frames and methods that foreground the voices of programme developers as well as practitioners, and that emphasise the importance of knowledge coproduction, pragmatic formative evaluations have the capacity to contribute towards interventions that have high external validity and more sustainable implementation practices.

Continued development of process evaluation is necessary if public health researchers are to understand and capture messy realities, and to respond to them through effective intervention. To this end, it is important not to condemn …

DECIPHer, School of Social Sciences, Cardiff University, Cardiff, UK
متن کاملذخیره در منابع من
با ذخیره ی این منبع در منابع من، دسترسی به آن را برای استفاده های بعدی آسان تر کنید
برای دانلود متن کامل این مقاله و بیش از 32 میلیون مقاله دیگر ابتدا ثبت نام کنید
ثبت ناماگر عضو سایت هستید لطفا وارد حساب کاربری خود شوید
Journal: Journal of Epidemiology and Community Health
Volume 69, Issue 10
Pages: -
Published: 2015